Make the package compatible with Node.js and other runtimes #30
Conversation
This package can now be published on npm with:
Hey, thanks for this contribution! I appreciate it. I have a few small concerns at the moment, but if we can address them I'm willing to consider publishing it on npm.
transform-chunk-sizes.test.ts
Outdated
@@ -9,7 +9,7 @@ import { TransformChunkSizes } from "./transform-chunk-sizes.ts";
  */
 class NumberSource extends ReadableStream<Uint8Array> {
   constructor(delayMs: number, chunksCount: number, bytesPerChunk = 1) {
-    let intervalTimer: number;
+    let intervalTimer: any;
Suggested change:
-    let intervalTimer: any;
+    let intervalTimer: ReturnType<typeof setTimeout>;
Better to use a specific type.
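A minimal sketch of why the derived type is preferable (illustrative, not the repo's code): in browsers and Deno, `setInterval` returns a number, while under `@types/node` it returns a `Timeout` object, so `ReturnType<typeof setInterval>` type-checks against either set of type definitions.

```typescript
// Deriving the timer type from the function itself, instead of hard-coding
// `number`, compiles cleanly on Deno, browsers, and Node (@types/node).
let intervalTimer: ReturnType<typeof setInterval>;

intervalTimer = setInterval(() => {}, 1000);
clearInterval(intervalTimer); // clearInterval accepts the same derived type everywhere
```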
Ok, fixed.
transform-chunk-sizes.ts
Outdated
@@ -1,36 +1,39 @@
 import { Buffer } from "./deps.ts";
Unfortunately, something about this change to TransformChunkSizes is causing the integration tests to fail, which means there must be a bug in how it's processing the data.

> Then I discovered there is a better alternative to Buffer available in every runtime: https://sindresorhus.com/blog/goodbye-nodejs-buffer

That article is about Node's Buffer. This repo uses the Deno Buffer, which explicitly says:
> Buffer is NOT the same thing as Node's Buffer. Node's Buffer was created in 2009 before JavaScript had the concept of ArrayBuffers. It's simply a non-standard ArrayBuffer.
What is wrong with using Deno's Buffer implementation? It doesn't use any Deno-specific APIs as far as I know. If you just leave it in, what doesn't work?
BTW, at one point Deno std decided to deprecate their Buffer, so I actually created a branch of this that works without it, by copying its code into this repo: https://github.com/bradenmacdonald/deno-s3-lite-client/compare/no-buffer . But they changed their mind, and so I'm still using Deno std Buffer.
I wasn't sure dnt would make it work; I can try.
build_npm.ts
Outdated
custom: [
  {
    package: {
      name: "web-streams-polyfill",
Question: Can we just import these from node:stream/web, which is built into Node.js? That would keep the overall npm package size smaller, which is a goal for this project (small and no third-party dependencies outside std lib).
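For reference, a minimal sketch of that built-in import (assuming Node 16.5+, where node:stream/web is available; the example code is illustrative, and as the thread notes below, the types may not line up with the lib.dom stream types):

```typescript
// The web stream classes ship with Node itself under node:stream/web,
// so no polyfill package is needed for Node targets.
import { TransformStream } from "node:stream/web";

// A trivial transform to show the classes are usable as-is.
const upper = new TransformStream<string, string>({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  },
});
```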
It seems node:stream/web doesn't implement streams the same way:
Hmm, it might be just a type error; I wonder if it works if the type checking is disabled. But in any case, seems like it's not a straightforward change. Thanks for checking into it anyways.
I will try to run the integration tests.
Ok, in fact the integration tests fail with the polyfill as well.
Please also make sure the formatter and linter issues are fixed:
@bradenmacdonald I managed to make it work!
I published it here temporarily:
So after using my version in a Cloudflare Worker, I found one last issue, in Undici: just one old import that breaks things.
Even if I resolve this issue locally, there is another one after.
Ok, so I finally found the best way possible!
@bradenmacdonald can you allow the workflow to run?
Ok, Node.js was working but not wrangler, due to a weird bug reported here:
Thanks for your work here! I've been sick for a few days but I'll try to look at this soon. I would like to play around with this myself to see if there's a way to stick with the ReadableStream.from() method used before, because the code was shorter and more concise that way.
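The trade-off being weighed here can be sketched as follows (hypothetical helper name, not the repo's code): ReadableStream.from() is the concise option but is newer (Deno 1.33+, Node 20.6+), while a manual pull-based wrapper runs on any runtime where the ReadableStream constructor itself exists.

```typescript
// Fallback for runtimes that lack ReadableStream.from(): wrap any sync
// iterable in a pull-based stream, emitting one item per pull.
function streamFrom<T>(items: Iterable<T>): ReadableStream<T> {
  const it = items[Symbol.iterator]();
  return new ReadableStream<T>({
    pull(controller) {
      const { value, done } = it.next();
      if (done) controller.close();
      else controller.enqueue(value);
    },
  });
}
```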
  },
});
})();
}
I think this ^ merging into globals will be done automatically by dnt if you specify it in the dnt config shims.custom:
shims: {
deno: {
test: "dev",
},
custom: [{
package: {
name: "node:stream/web",
},
globalNames: [
"ReadableStream",
"WritableStream",
"TransformStream",
],
}],
},
See denoland/dnt#362 and this integration test which checks that this approach works.
I did it here, and it does work in the Node.js env but breaks in the Workers env.
That's why I implemented it the way the oak project does (made by a Deno maintainer).
My PR in undici got merged, so now if Cloudflare resolves this issue: cloudflare/workerd#1711
@riderx Thanks to Deno's new JSR, I've been able to publish a new version that should work on all runtimes: https://jsr.io/@bradenmacdonald/s3-lite-client https://github.com/bradenmacdonald/s3-lite-client#installation

For Node.js, it requires Node 19+, because that's when they added the
I've tested it on Deno, Bun, Node 20, and Firefox, and it's working on all of them. Would you be able to let me know if it's working on CloudFlare Workers? Thanks for all your help with this :) I did need to incorporate some of your fixes for e.g.
Nice! Let me try that today :)
So for my test, it seems the download URL gets generated but the format is not correct:

<Error>
<Code>SignatureDoesNotMatch</Code>
<Message>The request signature we calculated does not match the signature you provided. Check your secret access key and signing method.</Message>
</Error>
@riderx Hmm, I don't suppose you know what's wrong? Does it work with this/your branch, using the exact same code? Signature issues are often due to the region setting, a wrong clock (doesn't seem to be the case), or issues with URL encoding such as
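One of those URL-encoding pitfalls can be sketched like this (a hypothetical helper for illustration, not this repo's actual signing code): AWS Signature V4 canonical requests must be percent-encoded per RFC 3986, but encodeURIComponent() leaves `!'()*` unescaped, which produces SignatureDoesNotMatch when such characters appear in an object key.

```typescript
// RFC 3986-style percent-encoding for SigV4 canonical requests:
// start from encodeURIComponent, then also escape the five characters
// it leaves alone (! ' ( ) *), uppercasing the hex digits.
function awsUriEncode(s: string): string {
  return encodeURIComponent(s).replace(
    /[!'()*]/g,
    (c) => "%" + c.charCodeAt(0).toString(16).toUpperCase(),
  );
}
```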
I double-checked in my fork and got the same issue.
@riderx Did you have any luck figuring this out with CloudFlare Workers? In the meantime, I'm going to close this issue, as the new version at https://jsr.io/@bradenmacdonald/s3-lite-client is compatible with most runtimes. Feel free to open a new issue for CloudFlare compatibility or any other compatibility bugs you may find.
So sad to see this epic PR not get merged in, despite it being a good thing that JSR solves the problem. However, I am having some issues while trying to use pnpm + Cloudflare Workers. Firstly, rather than
Secondly, I can install just fine with
Maybe I am doing something wrong. For context, my IDE's Node language server is Node 20, despite Cloudflare Workers not using Node. Not sure if there's a way to configure the IDE for V8 isolate resolution. But in any event, thanks to you both for supporting S3 on the edge!
It seems your pnpm config is incorrect; it still tries to fetch from npm, not JSR.
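A hedged sketch of the usual pnpm-plus-JSR setup (these commands follow JSR's generic npm-compatibility workflow, not anything specific to this repo's docs):

```shell
# Point the @jsr scope at JSR's npm-compatibility registry
# so pnpm stops trying to resolve the package from npm:
echo "@jsr:registry=https://npm.jsr.io" >> .npmrc

# JSR packages then install under the @jsr scope (scope__name mangling):
pnpm add @jsr/bradenmacdonald__s3-lite-client

# Alternatively, let the jsr CLI handle both steps:
pnpm dlx jsr add @bradenmacdonald/s3-lite-client
```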
Hi, I really like your package and use it in my project Capgo.app.
My backend is deployed to multiple environments: Deno, Node.js, and Cloudflare Workers.
On Node.js, I was using Minio as an alternative, with some conversion scripts.
Now, since I switched to an import map, I cannot do so.
For a while, I copied all your files into my project and removed the references to the Deno lib.
Then I discovered there is a better alternative to Buffer available in every runtime:
https://sindresorhus.com/blog/goodbye-nodejs-buffer
Then I made the script to publish on npm and published my fork here:
https://www.npmjs.com/package/@capgo/s3-lite-client
I also created a huge PR to make the minio SDK compatible with all runtimes, but the change is huge, and even though all tests pass I'm not quite sure it will be merged anytime soon.
All tests passed, and the changes are the minimum possible.
Related issue: #26